Soft nearest prototype classification

Authors

  • Sambu Seo
  • Mathias Bode
  • Klaus Obermayer
Abstract

We propose a new method for the construction of nearest prototype classifiers which is based on a Gaussian mixture ansatz and which can be interpreted as an annealed version of learning vector quantization (LVQ). The algorithm performs a gradient descent on a cost function that minimizes the classification error on the training set. We investigate the properties of the algorithm and assess its performance for several toy data sets and for an optical letter classification task. Results show 1) that annealing the dispersion parameter of the Gaussian kernels improves classification accuracy; 2) that classification results are better than those obtained with standard learning vector quantization (LVQ 2.1, LVQ 3) for equal numbers of prototypes; and 3) that annealing of the width parameter improves the classification capability. Additionally, the principled approach provides an explanation of a number of features of the (heuristic) LVQ methods.
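As a rough illustration of the method described in the abstract, the sketch below implements SNPC-style training in NumPy: labeled prototypes receive soft Gaussian assignment probabilities, and stochastic gradient descent on the expected misclassification loss attracts correct-class prototypes toward each sample and repels wrong-class ones, while the kernel width sigma is annealed. All function names, hyperparameters, and the linear annealing schedule are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def snpc_fit(X, y, prototypes, proto_labels, lr=0.05,
             sigma_start=1.0, sigma_end=0.2, epochs=30, seed=0):
    """Sketch of soft nearest prototype classification (SNPC) training.

    Prototypes are updated by stochastic gradient descent on the local
    misclassification loss E(x) = sum_{j: c_j != y} P(j|x); the Gaussian
    width sigma is annealed from sigma_start down to sigma_end.
    """
    rng = np.random.default_rng(seed)
    W = prototypes.astype(float).copy()
    c = np.asarray(proto_labels)
    for sigma in np.linspace(sigma_start, sigma_end, epochs):
        for i in rng.permutation(len(X)):
            x, label = X[i], y[i]
            d2 = np.sum((W - x) ** 2, axis=1)
            s = -d2 / (2 * sigma ** 2)
            s -= s.max()                        # numerical stability
            P = np.exp(s) / np.exp(s).sum()     # soft assignments P(j|x)
            u = (c != label).astype(float)      # 1 for wrong-class prototypes
            E = np.dot(u, P)                    # local misclassification loss
            # descent step; dE/dW_j = P_j (u_j - E)(x - W_j) / sigma^2,
            # so correct-class prototypes (u_j = 0) move toward x and
            # wrong-class prototypes (u_j = 1) move away from it
            W += lr * (P * (E - u))[:, None] * (x - W) / sigma ** 2
    return W

def snpc_predict(X, W, proto_labels):
    """Nearest-prototype decision: each point gets its closest prototype's label."""
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    return np.asarray(proto_labels)[d2.argmin(axis=1)]
```

On a toy two-cluster problem, a few prototypes per class trained this way separate the classes cleanly; the annealing of sigma plays the role of the window rule in LVQ 2.1.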

Related articles

Regularized margin-based conditional log-likelihood loss for prototype learning

The classification performance of nearest prototype classifiers largely relies on the prototype learning algorithm. The minimum classification error (MCE) method and the soft nearest prototype classifier (SNPC) method are two important algorithms using misclassification loss. This paper proposes a new prototype learning algorithm based on the conditional log-likelihood loss (CLL), which is base...


Extending FSNPC to handle data points with fuzzy class assignments

In this paper we present an advanced Nearest Prototype Classification to handle data points with unsharp class assignments. To this end, we extend the Soft Nearest Prototype Classification proposed by Seo et al. and its further enhancement working with fuzzy labeled prototypes introduced by Villmann et al. We adapt the cost function and derive appropriate update rules for the prototypes. We ...


Improving nearest neighbor classification using Ensembles of Evolutionary Generated Prototype Subsets

Prototype selection reduces the dataset before the application of a classifier in order to achieve improved accuracy and/or a considerable reduction in the number of instances. Among the proposed algorithms, evolutionary methods are the state-of-the-art. In (Verbiest et al., 2016), we developed a framework to further enhance the performance of these methods with minimal additional effort. We ...


Selecting promising classes from generated data for an efficient multi-class nearest neighbor classification

The nearest neighbor rule is one of the most widely considered algorithms for supervised learning because of its simplicity and fair performance in most cases. However, this technique has a number of disadvantages, the most prominent being its low computational efficiency. This paper presents a strategy to overcome this obstacle in multi-class classification tasks. This strategy proposes the use o...
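For context on the efficiency point in the snippet above, the plain (brute-force) nearest neighbor rule fits in a few lines, but every query must scan the entire training set, which is exactly the cost such strategies aim to reduce. This is an illustrative sketch, not code from the cited paper.

```python
import numpy as np

def nn_classify(X_train, y_train, X_query):
    """Brute-force 1-nearest-neighbor rule: label each query point like
    its closest training example. Cost is O(n_train) distance
    computations per query, the efficiency drawback noted above."""
    # pairwise squared Euclidean distances, shape (n_query, n_train)
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    return np.asarray(y_train)[d2.argmin(axis=1)]
```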


On the use of evolutionary feature selection for improving fuzzy rough set based prototype selection

The k-nearest neighbors classifier is a widely used classification method that has proven to be very effective in supervised learning tasks. In this paper, a fuzzy rough set method for prototype selection, focused on optimizing the behavior of this classifier, is presented. The hybridization with an evolutionary feature selection method is considered to further improve its performance, obtainin...



Journal:
  • IEEE Transactions on Neural Networks

Volume: 14  Issue: 2

Pages: -

Publication date: 2003